    Mid to Late Season Weed Detection in Soybean Production Fields Using Unmanned Aerial Vehicle and Machine Learning

    Mid-to-late-season weeds are those that escape early season herbicide applications or emerge late in the season. They might not affect the current crop's yield, but if left uncontrolled, they produce a large number of seeds that cause problems in subsequent years. In this study, high-resolution aerial imagery of mid-season weeds in soybean fields was captured using an unmanned aerial vehicle (UAV), and the performance of two automated weed detection approaches, patch-based classification and object detection, was studied for site-specific weed management. For the patch-based classification approach, several conventional machine learning models trained on Haralick texture features were compared with a MobileNetV2-based convolutional neural network (CNN) model for their classification performance. The results showed that the CNN model had the best classification performance on individual patches. Two image slicing approaches, patches with and without overlap, were tested; slicing with overlap led to improved weed detection but longer inference time. For the object detection approach, two models with different network architectures, Faster R-CNN and SSD, were evaluated and compared. Faster R-CNN had better overall weed detection performance than SSD with similar inference time, and it also had better detection performance and shorter inference time than the patch-based CNN with overlapping image slicing. The influence of spatial resolution on weed detection accuracy was investigated by simulating UAV imagery captured at different altitudes; Faster R-CNN achieved similar performance at a lower spatial resolution. The inference time of Faster R-CNN was evaluated on a regular laptop. The results showed the potential for on-farm, near real-time weed detection in soybean production fields by capturing UAV imagery with less overlap and processing it with a pre-trained deep learning model, such as Faster R-CNN, on regular laptops and mobile devices. Advisor: Yeyin Shi
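
    A minimal sketch of the overlapping image-slicing idea described in the abstract: a large aerial mosaic is cut into fixed-size patches, with the stride controlled by an overlap fraction. The patch size, overlap value, and toy mosaic below are illustrative assumptions, not the study's settings.

```python
# Slice a large aerial image into fixed-size patches, with and without overlap,
# for patch-based classification. Values here are illustrative assumptions.
import numpy as np

def slice_patches(image: np.ndarray, patch: int = 128, overlap: float = 0.5):
    """Yield (row, col, patch) tuples; overlap=0.0 gives non-overlapping slicing."""
    stride = max(1, int(patch * (1.0 - overlap)))
    h, w = image.shape[:2]
    for r in range(0, h - patch + 1, stride):
        for c in range(0, w - patch + 1, stride):
            yield r, c, image[r:r + patch, c:c + patch]

# Example: a fake 1024x1024 RGB mosaic; each patch would be fed to a
# classifier (e.g., a MobileNetV2-based CNN) to flag weed patches.
mosaic = np.zeros((1024, 1024, 3), dtype=np.uint8)
n_overlap = sum(1 for _ in slice_patches(mosaic, overlap=0.5))
n_plain = sum(1 for _ in slice_patches(mosaic, overlap=0.0))
print(n_overlap, n_plain)  # overlap yields several times more patches, hence longer inference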

    Elucidating Sorghum Biomass, Nitrogen and Chlorophyll Contents With Spectral and Morphological Traits Derived From Unmanned Aircraft System

    Unmanned aircraft systems (UAS) provide an efficient way to phenotype crop morphological and spectral traits, such as plant height, canopy cover and various vegetation indices (VIs), which provide information to elucidate genotypic responses to the environment. In this study, we investigated the potential use of UAS-derived traits to elucidate biomass, nitrogen and chlorophyll content in sorghum under nitrogen stress treatments. A nitrogen stress trial located in Nebraska, USA, contained 24 different sorghum lines, 2 nitrogen treatments and 8 replications, for a total of 384 plots. Morphological and spectral traits including plant height, canopy cover and various VIs were derived from UAS flights with a true-color RGB camera and a 5-band multispectral camera at early, mid and late growth stages across the sorghum growing season in 2017. Simple and multiple regression models were investigated for sorghum biomass, nitrogen and chlorophyll content estimation using the derived morphological and spectral traits along with manual ground-truth measurements. Results showed that the UAS-derived plant height was strongly correlated with manually measured plant height (r = 0.85), and the UAS-derived biomass estimates based on plant height, canopy cover and VIs had strong exponential correlations with the sampled biomass of fresh stalks and leaves (maximum r = 0.85) and of dry stalks and leaves (maximum r = 0.88). The UAS-derived VIs were moderately correlated with the laboratory-measured leaf nitrogen content (r = 0.52) and the measured leaf chlorophyll content (r = 0.69) in each plot. The methods developed in this study will facilitate genetic improvement and agronomic studies that require assessment of stress responses in large-scale field trials.
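
    A minimal sketch of the kind of analysis the abstract describes: compute a vegetation index (NDVI) from multispectral bands and fit an exponential biomass model. The synthetic data and the model form y = a*exp(b*x) are assumptions for illustration, not the study's actual data or calibrated coefficients.

```python
# Fit an exponential biomass model to a UAS-derived vegetation index.
# All data below is synthetic; nothing reproduces the study's measurements.
import numpy as np
from scipy.optimize import curve_fit

def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red + 1e-9)

def exp_model(x, a, b):
    return a * np.exp(b * x)

rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.15, 50)   # per-plot mean red reflectance (toy values)
nir = rng.uniform(0.3, 0.6, 50)     # per-plot mean NIR reflectance (toy values)
plot_vi = ndvi(nir, red)
biomass = exp_model(plot_vi, 0.5, 3.0) * rng.lognormal(0.0, 0.1, 50)  # toy truth

(a, b), _ = curve_fit(exp_model, plot_vi, biomass, p0=(1.0, 1.0))
r = np.corrcoef(exp_model(plot_vi, a, b), biomass)[0, 1]
print(f"fitted a={a:.2f}, b={b:.2f}, correlation r={r:.2f}")
```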

    Principal variable selection to explain grain yield variation in winter wheat from features extracted from UAV imagery

    Background: Automated phenotyping technologies are continually advancing the breeding process. However, collecting various secondary traits throughout the growing season and processing massive amounts of data still take great effort and time. Selecting a minimum number of secondary traits that have the maximum predictive power has the potential to reduce phenotyping effort. The objective of this study was to select the principal features extracted from UAV imagery, and the critical growth stages, that contributed the most to explaining winter wheat grain yield. Five dates of multispectral images and seven dates of RGB images were collected by a UAV system during the spring growing season in 2018. Two classes of features (variables), totaling 172 variables, were extracted for each plot from the vegetation index and plant height maps, including pixel statistics and dynamic growth rates. A parametric algorithm, LASSO regression (least absolute shrinkage and selection operator), and a non-parametric algorithm, random forest, were applied for variable selection. The regression coefficients estimated by LASSO and the permutation importance scores provided by random forest were used to determine the ten most important variables influencing grain yield for each algorithm. Results: Both selection algorithms assigned the highest importance scores to variables related to plant height around the grain filling stage. Some vegetation-index-related variables were also selected by the algorithms, mainly at early to mid growth stages and during senescence. Compared with yield prediction using all 172 variables derived from measured phenotypes, prediction using the selected variables performed comparably or even better. We also noticed that the prediction accuracy for the adapted NE lines (r = 0.58–0.81) was higher than for the other lines (r = 0.21–0.59) included in this study with different genetic backgrounds. Conclusions: With the ultra-high-resolution plot imagery obtained by UAS-based phenotyping, we are now able to derive more features, such as the within-plot variation of plant height or vegetation indices rather than just an averaged number, that are potentially very useful for breeding purposes. However, too many features or variables can be derived this way. The promising results from this study suggest that the selected subset of variables can achieve grain yield prediction accuracy comparable to the full set, while possibly allowing a better allocation of effort and resources for phenotypic data collection and processing.
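
    A minimal sketch of the two variable-selection routes the abstract names: ranking features by LASSO coefficient magnitude and by random-forest permutation importance, keeping the top ten from each. Synthetic data stands in for the 172 UAV-derived variables; nothing here reproduces the study's actual features or hyperparameters.

```python
# Select the ten most important variables via LASSO and permutation importance.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 172))          # 200 plots x 172 extracted variables (toy)
y = X[:, 0] * 2.0 + X[:, 5] - X[:, 40] + rng.normal(scale=0.5, size=200)

# Parametric route: LASSO shrinks uninformative coefficients toward zero.
lasso = Lasso(alpha=0.1).fit(X, y)
top_lasso = np.argsort(np.abs(lasso.coef_))[::-1][:10]

# Non-parametric route: permutation importance from a random forest.
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
perm = permutation_importance(rf, X, y, n_repeats=10, random_state=0)
top_rf = np.argsort(perm.importances_mean)[::-1][:10]

print("LASSO top-10 feature indices:", top_lasso)
print("RF permutation top-10 indices:", top_rf)
```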

    Maize Tassel Detection From UAV Imagery Using Deep Learning

    The timing of flowering plays a critical role in determining the productivity of agricultural crops. If a crop flowers too early, it matures before the end of the growing season, losing the opportunity to capture and use large amounts of light energy. If it flowers too late, it may be killed by the change of seasons before it is ready to harvest. Maize flowering is one of the most important periods, when even small amounts of stress can significantly alter yield. In this work, we developed and compared two methods for automatic tassel detection in imagery collected from an unmanned aerial vehicle, using deep learning models. The first approach was a customized framework for tassel detection based on a convolutional neural network (TD-CNN). The other was a state-of-the-art object detection technique, the faster region-based CNN (Faster R-CNN), serving as a baseline for detection accuracy. The evaluation criteria for tassel detection were customized to correctly reflect the needs of tassel detection in an agricultural setting. Although detecting thin tassels in aerial imagery is challenging, our results showed promising accuracy: the TD-CNN had an F1 score of 95.9% and the Faster R-CNN had an F1 score of 97.9%. More CNN-based model structures can be investigated in the future for improved accuracy, speed, and generalizability of aerial-based tassel detection.
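
    A minimal sketch of a detection-style F1 score, the metric both models are compared on: predicted boxes are greedily matched to ground-truth boxes by intersection-over-union (IoU), and precision, recall, and F1 follow from the match counts. The IoU threshold and toy boxes are illustrative assumptions; the paper's customized criteria may differ.

```python
# Compute a detection F1 score by greedy IoU matching of boxes (x1, y1, x2, y2).
def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

def detection_f1(preds, truths, thr=0.5):
    matched, tp = set(), 0
    for p in preds:
        # Best still-unmatched ground-truth box for this prediction.
        best = max(range(len(truths)),
                   key=lambda i: iou(p, truths[i]) if i not in matched else -1,
                   default=None)
        if best is not None and best not in matched and iou(p, truths[best]) >= thr:
            matched.add(best)
            tp += 1
    fp, fn = len(preds) - tp, len(truths) - tp
    prec = tp / (tp + fp + 1e-9)
    rec = tp / (tp + fn + 1e-9)
    return 2 * prec * rec / (prec + rec + 1e-9)

# One true positive, one false positive -> precision 0.5, recall 1.0, F1 ~0.67.
print(detection_f1([(0, 0, 10, 10), (20, 20, 30, 30)], [(1, 1, 10, 10)]))
```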

    WayFAST: Navigation with Predictive Traversability in the Field

    We present a self-supervised approach for learning to predict traversable paths for wheeled mobile robots that require good traction to navigate. Our algorithm, termed WayFAST (Waypoint Free Autonomous Systems for Traversability), uses RGB and depth data, along with navigation experience, to autonomously generate traversable paths in outdoor unstructured environments. Our key inspiration is that traction can be estimated for rolling robots using kinodynamic models. Using traction estimates provided by an online receding horizon estimator, we are able to train a traversability prediction neural network in a self-supervised manner, without requiring the heuristics used by previous methods. We demonstrate the effectiveness of WayFAST through extensive field testing in varying environments, ranging from sandy dry beaches to forest canopies and snow-covered grass fields. Our results clearly demonstrate that WayFAST can learn to avoid geometric obstacles as well as untraversable terrain, such as snow, which would be difficult to avoid with sensors that provide only geometric data, such as LiDAR. Furthermore, we show that our training pipeline based on online traction estimates is more data-efficient than other heuristic-based methods. Project website with code and videos: https://mateusgasparino.com/wayfast-traversability-navigation/. Published in IEEE Robotics and Automation Letters (RA-L, 2022); accepted for presentation at the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2022).
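
    A highly simplified sketch of the traction idea behind the abstract: if, under a unicycle-like kinodynamic model, measured velocity is roughly a traction coefficient mu times commanded velocity, then a sliding-window least-squares estimate of mu can stand in for the paper's receding horizon estimator. The window size, model form, and signals below are illustrative assumptions, not the authors' implementation.

```python
# Estimate a traction coefficient mu from commanded vs. measured velocity,
# assuming v_meas ~= mu * v_cmd over a sliding window (a simplification of
# the paper's receding horizon estimator).
from collections import deque

class TractionEstimator:
    def __init__(self, window: int = 50):
        self.cmd = deque(maxlen=window)
        self.meas = deque(maxlen=window)

    def update(self, v_cmd: float, v_meas: float) -> float:
        self.cmd.append(v_cmd)
        self.meas.append(v_meas)
        # Closed-form least squares for v_meas = mu * v_cmd over the window.
        denom = sum(c * c for c in self.cmd) + 1e-9
        return sum(c * m for c, m in zip(self.cmd, self.meas)) / denom

est = TractionEstimator()
for v_cmd, v_meas in [(1.0, 0.9), (1.0, 0.5), (1.0, 0.3)]:  # wheels slipping
    mu = est.update(v_cmd, v_meas)
print(f"estimated traction mu={mu:.2f}")  # low mu -> terrain treated as less traversable
```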
